

KIISE Transactions on Computing Practices


Korean Title: A Retrieval-based Dialogue System Using BERT Incorporating External Knowledge
English Title: External Knowledge Incorporated BERT for Retrieval-based Dialogue Systems
Authors: Janghoon Han (한장훈), Youngjoong Ko (고영중), Jungyun Seo (서정연)
Citation: Vol. 27, No. 12, pp. 549-554, Nov. 2021
Korean Abstract (translated):
Developing dialogue systems that can interact with humans is one of the important challenges in artificial intelligence. To address this problem, research on leveraging external knowledge in dialogue systems has been conducted steadily. However, learning external knowledge requires structured knowledge, and generating it demands considerable resources. From this perspective, this study proposes a model that uses unstructured text as external knowledge in a retrieval-based dialogue system. BERT, a pre-trained language model, is used as the base model, and external knowledge is taught to the model through post-training. The post-trained model is then fine-tuned on the dialogue response selection task. Compared to the baseline BERT model, the model trained with external knowledge showed a 1.3% improvement in R10@1 on the Ubuntu corpus.
English Abstract:
Developing dialogue systems is a crucial aspect of artificial intelligence. Various studies have been conducted to improve the performance of dialogue systems using external knowledge. However, well-structured external knowledge is required for it to be applied to dialogue systems, and generating it takes a considerable amount of time and resources. From this point of view, this study proposes a model that uses unstructured texts as external knowledge for retrieval-based dialogue systems. We use BERT, a pre-trained language model, as the base model for the response selection task, and then train it on the external knowledge through post-training. Finally, the post-trained model is fine-tuned on the dialogue response selection task. Compared to the existing BERT model, the performance of the post-trained model with external knowledge improved by 1.3% in R10@1 on the Ubuntu corpus.
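The abstract reports results in R10@1, the standard metric for this task: for each dialogue context, the model scores 10 candidate responses (one ground truth, nine distractors) and R10@1 is the fraction of contexts where the ground truth is ranked first. A minimal sketch of this evaluation, where the toy scores stand in for the fine-tuned BERT cross-encoder's (context, candidate) scores:

```python
def r10_at_1(score_batches):
    """Compute R10@1. Each element of score_batches is a list of 10
    candidate scores for one dialogue context; by convention here,
    index 0 is the ground-truth response."""
    hits = sum(
        1 for scores in score_batches
        if max(range(len(scores)), key=lambda i: scores[i]) == 0
    )
    return hits / len(score_batches)

# Toy scores for three contexts, 10 candidates each (index 0 = ground truth).
batches = [
    [0.9] + [0.1] * 9,       # ground truth ranked first -> hit
    [0.2, 0.8] + [0.1] * 8,  # a distractor outranks it -> miss
    [0.7] + [0.3] * 9,       # ground truth ranked first -> hit
]
print(r10_at_1(batches))  # 2 hits out of 3 contexts
```

The reported 1.3% gain means, for example, moving from ranking the true response first on 71,700 of 100,000 test contexts to 73,000; the convention that index 0 holds the ground truth is an assumption of this sketch, matching how the Ubuntu corpus test set is commonly arranged.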
Keywords: deep learning, natural language processing, retrieval-based dialog system, BERT, post-training, response selection